openai-experimental[patch]: Adds experimental raw response field to OpenAI chat models #6087
Conversation
@@ -28,6 +28,24 @@ import { AzureChatOpenAI } from "../../azure/chat_models.js";
// Save the original value of the 'LANGCHAIN_CALLBACKS_BACKGROUND' environment variable
Hey team, I've flagged the added `beforeAll` block in the PR for review as it explicitly sets environment variables using `process.env`. This change should be reviewed to ensure it aligns with our environment variable management practices.
@@ -5,6 +5,24 @@ import { AIMessageChunk } from "@langchain/core/messages";
import { AzureChatOpenAI } from "../../azure/chat_models.js";
Hey team, I've flagged the recent change in the PR for review as it explicitly accesses and sets environment variables using `process.env`. Please take a look and ensure it aligns with our best practices for handling environment variables. Thanks!
@@ -1,6 +1,24 @@
import { test, expect } from "@jest/globals";
Hey there! I noticed that the recent PR includes changes that access and set environment variables using `process.env`. This comment is to flag the change for maintainers to review the handling of environment variables in the code. Great work on the PR!
@@ -15,6 +15,24 @@ import { AzureOpenAI } from "../../azure/llms.js";
// Save the original value of the 'LANGCHAIN_CALLBACKS_BACKGROUND' environment variable
Hey team, just a heads up that I've flagged the recent changes in the `llms.int.test.ts` file for review. The added `beforeAll` block sets environment variables based on corresponding `TEST_`-prefixed variables, so it's important to ensure everything is handled correctly. Let me know if you have any questions!
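The flagged `beforeAll` blocks reportedly map `TEST_`-prefixed environment variables onto their unprefixed names before the integration tests run. A minimal sketch of that pattern (the helper name and the variable names here are hypothetical, not taken from the PR):

```typescript
// Hypothetical helper: copy each TEST_-prefixed variable onto its
// unprefixed name so integration tests pick up dedicated credentials.
function copyTestEnvVars(
  env: Record<string, string | undefined>,
  keys: string[]
): void {
  for (const key of keys) {
    const testValue = env[`TEST_${key}`];
    if (testValue !== undefined) {
      env[key] = testValue;
    }
  }
}

// In a Jest suite this might run once before all tests, e.g.:
// beforeAll(() => {
//   copyTestEnvVars(process.env, ["AZURE_OPENAI_API_KEY"]);
// });
```

Variables without a `TEST_`-prefixed counterpart are left untouched, so local overrides still apply.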
…jacob/experimental_raw_response
Passing `__includeRawResponse: true` on instantiation will return the raw response from OpenAI in `additional_kwargs`.

It's not (and probably won't ever be) exactly 1:1 with OpenAI: we add an extra metadata chunk at the end rather than yielding it when OpenAI emits it. The final result of chunk concatenation also won't be meaningful, since merging multiple OpenAI streaming responses doesn't make sense.
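A rough usage sketch of the flag described above (this assumes the `@langchain/openai` package and a valid OpenAI API key in the environment; the model name is illustrative, and the exact key the raw response lands under in `additional_kwargs` is not specified in this description):

```typescript
import { ChatOpenAI } from "@langchain/openai";

// Opt in to the experimental raw response field at instantiation time.
const model = new ChatOpenAI({
  model: "gpt-4o-mini",
  __includeRawResponse: true,
});

const message = await model.invoke("Hello!");

// Alongside the parsed message content, additional_kwargs now also
// carries the raw response from OpenAI.
console.log(message.additional_kwargs);
```

When streaming, per the note above, expect one extra metadata chunk at the end of the stream rather than raw data interleaved exactly as OpenAI emits it.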